Metric-Distortion Bounds Under Limited Information

Authors

Abstract

In this work we study the metric distortion problem in voting theory under a limited amount of ordinal information. Our primary contribution is threefold. First, we consider mechanisms that perform a sequence of pairwise comparisons between candidates. We show that a widely-popular deterministic mechanism, employed in most knockout phases, yields distortion \(\mathcal{O}(\log m)\) while eliciting only \(m-1\) out of the \(\varTheta(m^2)\) possible pairwise comparisons, where \(m\) represents the number of candidates; we also provide a matching lower bound on its distortion. In contrast, we prove that any mechanism which performs fewer than \(m-1\) pairwise comparisons has unbounded distortion. Moreover, we study the power of mechanisms under incomplete rankings. Most notably, we show that when every agent provides her \(k\)-top preferences, an upper bound of \(6m/k + 1\) on the distortion is attainable, for any \(k \in \{1, 2, \dots, m\}\), substantially improving over the previous bound of \(12m/k\) recently established by Kempe [25, 26]. Finally, we are concerned with the sample complexity required to ensure near-optimal distortion with high probability. Our main contribution is to show that a random sample of \(\varTheta(m/\epsilon^2)\) voters suffices to guarantee distortion \(3 + \epsilon\) with high probability, for any sufficiently small \(\epsilon > 0\). This result is based on analyzing the sensitivity of the deterministic mechanism introduced by Gkatzelis, Halpern, and Shah [22].
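The knockout mechanism described above can be sketched in a few lines: repeatedly pair up the surviving candidates and advance the pairwise-majority winner of each match, so that each comparison eliminates exactly one candidate and only \(m-1\) comparisons are elicited in total. The sketch below is illustrative only; the function names and the line-metric example are assumptions, with `prefers(a, b)` standing in for a pairwise-majority oracle over the voters' ordinal preferences.

```python
def knockout_winner(candidates, prefers):
    """Single-elimination knockout over the candidates.

    prefers(a, b) should return True iff a majority of voters rank a
    above b. Each match eliminates exactly one candidate, so exactly
    m - 1 pairwise comparisons are made for m candidates.
    """
    remaining = list(candidates)
    comparisons = 0
    while len(remaining) > 1:
        next_round = []
        # With an odd number of survivors, the last one gets a bye.
        if len(remaining) % 2 == 1:
            next_round.append(remaining.pop())
        for a, b in zip(remaining[::2], remaining[1::2]):
            comparisons += 1
            next_round.append(a if prefers(a, b) else b)
        remaining = next_round
    return remaining[0], comparisons


# Illustrative metric instance: voters and candidates as points on a
# line, with each voter preferring the closer candidate (this is the
# underlying metric the distortion bound is measured against).
voters = [0.1, 0.2, 0.8, 0.9, 0.5]
prefers = lambda a, b: sum(abs(v - a) < abs(v - b) for v in voters) > len(voters) / 2

winner, comps = knockout_winner([0.0, 0.3, 0.6, 1.0], prefers)
# -> winner 0.6, using 3 = m - 1 comparisons
```

Note that the mechanism only queries the comparison oracle; it never sees the metric itself, which is exactly the limited-information setting the distortion bound quantifies.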


Similar articles

Learning under limited information

We study how human subjects learn under extremely limited information. We use Chen’s (forthcoming) data on cost sharing games, and Van Huyck et al.’s (1996, Manuscript) data on coordination games to compare three payoff-based learning models. Under the serial mechanism and coordination games, the payoff-assessment learning model (Sarin and Vahid, 1999, Games Econ. Behav. 28, 294–309) tracks the...


Distortion in the Spherical Metric under Quasiconformal Mappings

This paper contains bounds for the distortion in the spherical metric, that is to say, bounds for the constant of Hölder continuity of mappings f : (Rn, q) → (Rn, q) where q denotes the spherical metric. The mappings considered are K-quasiconformal (K ≥ 1) and satisfy some normalizations or restrictions. All bounds are explicit and asymptotically sharp as K → 1.


Mutual information rate, distortion, and quantization in metric spaces

Abstract—Several new properties, as well as simplified proofs of known properties, are developed for the mutual information rate between discrete-time random processes whose alphabets are Borel subsets of complete separable metric spaces. In particular, the asymptotic properties of quantizers for such spaces provide a link with finite-alphabet processes and yield the ergodic decomposition of mutual information rate. ...


Bounded-Distortion Metric Learning

Metric learning aims to embed one metric space into another to benefit tasks like classification and clustering. Although a greatly distorted metric space has a high degree of freedom to fit training data, it is prone to overfitting and numerical inaccuracy. This paper presents bounded-distortion metric learning (BDML), a new metric learning framework which amounts to finding an optimal Mahalan...


Towards Lower Bounds on Embedding Distortion in Information Hiding

We propose two efficient information hiding algorithms in the least significant bits of JPEG coefficients of images. Our algorithms embed information by modifying JPEG coefficients in such a way that the introduced distortion is minimized. We derive the expected value of the additional error due to distortion as a function of the message length and the probability distribution of the JPEG quant...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-85947-3_20